Forward-Backward Greedy Algorithms for General Convex Smooth Functions over A Cardinality Constraint
Abstract
We consider forward-backward greedy algorithms for solving sparse feature selection problems with general convex smooth functions. A state-of-the-art greedy method, the Forward-Backward greedy algorithm (FoBa-obj), requires solving a large number of optimization problems, so it does not scale to large problems. The FoBa-gdt algorithm, which uses gradient information for feature selection at each forward iteration, significantly improves the efficiency of FoBa-obj. In this paper, we systematically analyze the theoretical properties of both forward-backward greedy algorithms. Our main contributions are: 1) we derive better theoretical bounds than existing analyses of FoBa-obj for general smooth convex functions; 2) we show that FoBa-gdt achieves the same theoretical performance as FoBa-obj under the same condition, namely the restricted strong convexity condition; our new bounds are consistent with the bounds for a special case (least squares) and fill a previously existing theoretical gap for general convex smooth functions; 3) we show that the restricted strong convexity condition is satisfied if the number of independent samples is more than k̄ log d, where k̄ is the sparsity number and d is the dimension of the variable; 4) we apply FoBa-gdt (with the conditional random field objective) to the sensor selection problem for human indoor activity recognition, and our results show that FoBa-gdt outperforms other methods (including ones based on forward greedy selection and L1 regularization).
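To illustrate the gradient-driven selection rule described in the abstract, the following Python sketch implements a generic forward-backward greedy loop. It is a simplification written under assumptions of our own: the stopping thresholds delta and nu, the helper _refit, and the specific backward criterion are illustrative choices, not the authors' code, and the restricted subproblems are solved with an off-the-shelf L-BFGS routine.

import numpy as np
from scipy.optimize import minimize


def _refit(f, grad, d, support):
    # Minimize f over vectors supported on `support`, with zeros elsewhere.
    idx = sorted(support)

    def f_sub(z):
        x = np.zeros(d)
        x[idx] = z
        return f(x)

    def g_sub(z):
        x = np.zeros(d)
        x[idx] = z
        return grad(x)[idx]

    res = minimize(f_sub, np.zeros(len(idx)), jac=g_sub, method="L-BFGS-B")
    x = np.zeros(d)
    x[idx] = res.x
    return x


def foba_gdt(f, grad, d, delta=1e-4, nu=0.5, max_features=None):
    # Gradient-driven forward-backward greedy selection (illustrative sketch).
    x = np.zeros(d)
    support = set()
    budget = max_features if max_features is not None else d
    while len(support) < budget:
        g = grad(x)
        inactive = [j for j in range(d) if j not in support]
        if not inactive:
            break
        # Forward step: add the inactive coordinate with the largest |gradient|.
        j = max(inactive, key=lambda k: abs(g[k]))
        support.add(j)
        x_new = _refit(f, grad, d, support)
        gain = f(x) - f(x_new)
        if gain < delta:  # insufficient progress: undo the addition and stop
            support.remove(j)
            break
        x = x_new
        # Backward step(s): drop features whose removal barely hurts the objective.
        while len(support) > 1:
            loss = {k: f(_refit(f, grad, d, support - {k})) - f(x) for k in support}
            k_min = min(loss, key=loss.get)
            if loss[k_min] >= nu * gain:
                break
            support.remove(k_min)
            x = _refit(f, grad, d, support)
    return x, support

For a least-squares instance one would pass, for example, f = lambda x: 0.5 * np.linalg.norm(A @ x - y) ** 2 and grad = lambda x: A.T @ (A @ x - y); the same loop applies unchanged to any smooth convex objective for which f and grad are available.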
Similar papers
On a combination of forward-backward and Douglas-Rachford algorithms for image recovery problems
The objective of this paper is to develop methods for solving image recovery problems subject to constraints on the solution. More precisely, we will be interested in problems which can be formulated as the minimization over a closed convex constraint set of the sum of two convex functions f and g, where f may be non-smooth and g is differentiable with a Lipschitz-continuous gradient (a bare-bones forward-backward iteration for this setting is sketched after this list). To reach ...
Maximizing non-monotone submodular set functions subject to different constraints: Combined algorithms
We study the problem of maximizing constrained non-monotone submodular functions and provide approximation algorithms that improve existing algorithms in terms of either the approximation factor or simplicity. Our algorithms combine existing local search and greedy-based algorithms. Different constraints that we study are exact cardinality and multiple knapsack constraints. For the multiple-kna...
Greedy Algorithms for Cone Constrained Optimization with Convergence Guarantees
Greedy optimization methods such as Matching Pursuit (MP) and Frank-Wolfe (FW) algorithms regained popularity in recent years due to their simplicity, effectiveness and theoretical guarantees. MP and FW address optimization over the linear span and the convex hull of a set of atoms, respectively. In this paper, we consider the intermediate case of optimization over the convex cone, parametrized...
Weakly Submodular Maximization Beyond Cardinality Constraints: Does Randomization Help Greedy?
Submodular functions are a broad class of set functions, which naturally arise in diverse areas such as economics, operations research and game theory. Many algorithms have been suggested for the maximization of these functions, achieving both strong theoretical guarantees and good practical performance. Unfortunately, once the function deviates from submodularity (even slightly), the known alg...
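The second entry above concerns minimizing a sum f + g with g differentiable and Lipschitz-gradient. As a point of reference, a bare-bones forward-backward (proximal gradient) iteration is sketched below; the soft-thresholding prox (corresponding to an l1 term f) and the fixed step size are illustrative assumptions of this sketch, not details taken from that paper.

import numpy as np


def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1, used here as an example prox for f.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def forward_backward(grad_g, prox_f, x0, step, n_iter=200):
    # Each iteration takes a forward (gradient) step on g, then a backward
    # (proximal) step on f: x <- prox_f(x - step * grad_g(x), step).
    x = x0
    for _ in range(n_iter):
        x = prox_f(x - step * grad_g(x), step)
    return x

For a least-squares data term g(x) = 0.5 * ||A x - y||^2 one could call forward_backward(lambda x: A.T @ (A @ x - y), soft_threshold, np.zeros(A.shape[1]), step=1.0 / L), with L the largest eigenvalue of A.T @ A, so that the step size respects the usual Lipschitz condition.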